Convergence analysis of herded-Gibbs-type sampling algorithms: effects of weight sharing

Authors

Abstract


Related articles

Herded Gibbs Sampling

The Gibbs sampler is one of the most popular algorithms for inference in statistical models. In this paper, we introduce a herding variant of this algorithm, called herded Gibbs, that is entirely deterministic. We prove that herded Gibbs has an O(1/T ) convergence rate for models with independent variables and for fully connected probabilistic graphical models. Herded Gibbs is shown to outperfo...

Full text
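The herding step at the core of herded Gibbs can be illustrated for a single binary variable: instead of drawing from Bernoulli(p), a deterministic running weight accumulates p each step and emits a 1 whenever it crosses 1. A minimal sketch under that reading, for the independent-variable case; the function name and default weight are illustrative, not from the paper:

```python
def herded_bernoulli(p, T, w=0.5):
    """Deterministic herding sequence targeting Bernoulli(p).

    Each step adds p to a running weight; emit 1 and subtract 1
    whenever the weight reaches 1, else emit 0. The empirical
    frequency of ones tracks p with O(1/T) error.
    """
    xs = []
    for _ in range(T):
        w += p
        if w >= 1.0:
            xs.append(1)
            w -= 1.0
        else:
            xs.append(0)
    return xs

# Empirical frequency stays within O(1/T) of p:
seq = herded_bernoulli(0.25, 1000)
freq = sum(seq) / len(seq)  # close to 0.25
```

The subtraction keeps the weight in [0, 1), so the number of ones after T steps differs from pT by less than 1 — the source of the O(1/T) rate quoted above.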

Rates of Convergence for Gibbs Sampling for Variance Component Models

This paper analyzes the Gibbs sampler applied to a standard variance component model and considers how many iterations are required for convergence. It is proved that for K location parameters, with J observations each, the number of iterations required for convergence (for large K and J) is a constant times 1 + log K / log J. This is one of the first rigorous, a priori results abo...

Full text

Woman-Defined Identity: Analysis of Selected Poems of Adrienne Rich

The current thesis is composed of five chapters in the following fashion: chapter two encompasses the applied framework of the project in detail; the methodology of Carl Gustav Jung to explain the process of individuation, the major archetypes and their attributes, and his techniques to assess the mind's strata are all explained. Moreover, the Austrian psychoanalyst Heinz Kohut's models of a...

Empirical Analysis of the Divergence of Gibbs Sampling Based Learning Algorithms for Restricted Boltzmann Machines

Learning algorithms relying on Gibbs sampling based stochastic approximations of the log-likelihood gradient have become a common way to train Restricted Boltzmann Machines (RBMs). We study three of these methods, Contrastive Divergence (CD) and its refined variants Persistent CD (PCD) and Fast PCD (FPCD). As the approximations are biased, the maximum of the log-likelihood is not necessarily ob...

Full text
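The CD-family updates studied there share one skeleton: a short Gibbs chain started from the data yields a (biased) stochastic estimate of the log-likelihood gradient. A minimal CD-1 sketch for a Bernoulli RBM, assuming NumPy; the function name, shapes, and learning rate are illustrative, not from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_update(W, b, c, v0, lr=0.1):
    """One CD-1 step for a Bernoulli RBM.

    W: weights (n_visible, n_hidden); b: visible bias; c: hidden bias;
    v0: batch of binary visible vectors (batch, n_visible).
    """
    # Positive phase: hidden probabilities and a sample given the data.
    ph0 = sigmoid(v0 @ W + c)
    h0 = (rng.random(ph0.shape) < ph0).astype(float)
    # One Gibbs step: reconstruct visibles, recompute hidden probabilities.
    pv1 = sigmoid(h0 @ W.T + b)
    v1 = (rng.random(pv1.shape) < pv1).astype(float)
    ph1 = sigmoid(v1 @ W + c)
    # Biased gradient approximation: data statistics minus reconstruction
    # statistics (the bias is what the empirical study above examines).
    n = v0.shape[0]
    W += lr * (v0.T @ ph0 - v1.T @ ph1) / n
    b += lr * (v0 - v1).mean(axis=0)
    c += lr * (ph0 - ph1).mean(axis=0)
    return W, b, c
```

PCD differs only in where the chain starts: the negative-phase sample v1 is carried over between updates instead of being re-initialized at the data.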


Journal

Journal title: Statistics and Computing

Year: 2019

ISSN: 0960-3174,1573-1375

DOI: 10.1007/s11222-019-09852-6